# Post-script: where to go from here?

There is a lot more to linear algebra, of course. After learning about orthogonal diagonalizability, one can look into something called *singular value decomposition*, or *SVD* for short. With it, one can analyze "the most important components" of a matrix (a closely related technique, much used in statistics and economics, is PCA -- principal component analysis). One can use SVD to do lossy image compression (see [https://dmicz.github.io/machine-learning/svd-image-compression/](https://dmicz.github.io/machine-learning/svd-image-compression/)), or even voting analysis: from just the binary information of how a person voted, one can sort people into groups of similar alignment (see [https://saweis.net/svd/](https://saweis.net/svd/)).

Orthogonality and symmetric matrices also appear in the study of *quadratic forms*, functions of the form $q(x)=x^{T}Ax$, where $A$ is some $n\times n$ real symmetric matrix and $x$ is a vector in $\mathbb R^{n}$. Quadratic forms are important in many parts of math (calculus, algebra and number theory, geometry...). For instance, in calculus, if $f:\mathbb R^{n} \to \mathbb R$ is sufficiently differentiable at $p$, then the Taylor expansion of $f$ at $p$ is
$$
f(p+x) = f(p)+J_{p}x +\frac{1}{2} x^{T}H_{p}x + \cdots
$$
where $J_{p}$ is the Jacobian matrix of $f$ at $p$, and $H_{p}$ is the Hessian matrix of $f$ at $p$. Notice that if the first partial derivatives of $f$ all vanish at $p$, then
$$
f(p+x)-f(p) \approx \frac{1}{2} x^{T}H_{p}x,
$$
approximately a quadratic form! And since $H_{p}$ is symmetric, the spectral theorem diagonalizes it orthogonally, so the signs of its eigenvalues decide whether this quadratic form is positive near $p$ (a local minimum), negative (a local maximum), or takes both signs (a saddle). This is exactly the *second derivative test* ([https://mathworld.wolfram.com/SecondDerivativeTest.html](https://mathworld.wolfram.com/SecondDerivativeTest.html)) for functions on a plane, which often mystifies students who are never shown where it comes from.

Finally, we did not focus much, if at all, on this, but the idea of orthogonality extends beyond our usual vectors to functions as well. This gives a foundation for the theory of Fourier series ([https://mathworld.wolfram.com/FourierSeries.html](https://mathworld.wolfram.com/FourierSeries.html)), an important idea for studying functions as series of trigonometric functions.

And of course, you might have heard buzzwords like machine learning, neural networks, etc. At the core of these are "weights" arranged in layers of linear transformations, with some nonlinearity in between, trained to predict input-output pairs. (Small numerical sketches of these ideas -- SVD compression, the second derivative test, Fourier coefficients, and a tiny network -- appear after the roadmap below.)

In any case, if you are a student of mathematics, here is a road map of topics to get started, divided into rough categories (though, as is typical in math, things overlap heavily with one another):

- Analysis.
    - Real analysis, metric spaces and topology
    - Complex analysis
    - Basic ordinary and partial differential equations
- Algebra.
    - Linear algebra
    - Theory of groups, rings, and fields
    - Commutative algebra
- Geometry and topology.
    - Classical and synthetic geometry
    - Topology
    - Classical differential geometry
    - Algebraic topology, algebraic geometry
- Combinatorics and number theory.
    - Counting and generating functions
    - Graph theory
    - Algorithms and complexity
    - Probability
    - Basic number theory
- Applied mathematics.
    - Numerical methods in linear algebra
    - Numerical analysis
    - Modeling
    - Optimization
    - Statistics
- Foundations and abstract nonsense.
    - Set theory and logic
    - Category theory (although one would rightfully put this in algebra)
- ...

The list is long and not exhaustive, but this should get you started on your exciting journey! I wish you the very best!
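As promised, here are the sketches. First, SVD-based compression: a minimal NumPy sketch of the standard rank-$k$ truncation. The synthetic 64x64 "image" and the choice $k=5$ are made up for illustration.

```python
import numpy as np

# Build a small synthetic "image": a grayscale gradient with a bright square.
img = np.linspace(0, 1, 64)[None, :] * np.ones((64, 1))
img[16:48, 16:48] += 0.5

# SVD: img = U @ np.diag(s) @ Vt, singular values s sorted in decreasing order.
U, s, Vt = np.linalg.svd(img, full_matrices=False)

# Rank-k approximation: keep only the k largest singular values.
# (The Eckart-Young theorem says this is the best rank-k approximation
# in the Frobenius norm.)
k = 5
img_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

err = np.linalg.norm(img - img_k) / np.linalg.norm(img)
print(f"rank-{k} relative error: {err:.4f}")
```

Storing $U[:, :k]$, $s[:k]$, and $Vt[:k, :]$ takes roughly $k(2n+1)$ numbers instead of $n^2$, which is where the compression comes from.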
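Next, the second derivative test as code: a sketch that classifies a critical point by the signs of the eigenvalues of the Hessian. The function $f(x,y)=x^{2}+3xy+y^{2}$ is a made-up example, and its Hessian at the origin is computed by hand.

```python
import numpy as np

# Hessian of f(x, y) = x**2 + 3*x*y + y**2 at the critical point (0, 0):
# [[f_xx, f_xy], [f_yx, f_yy]].
H = np.array([[2.0, 3.0],
              [3.0, 2.0]])

# H is symmetric, so eigvalsh gives real eigenvalues; their signs decide
# the test, exactly as in the spectral-theorem argument above.
eigenvalues = np.linalg.eigvalsh(H)
if np.all(eigenvalues > 0):
    print("local minimum")
elif np.all(eigenvalues < 0):
    print("local maximum")
elif np.any(eigenvalues > 0) and np.any(eigenvalues < 0):
    print("saddle point")
else:
    print("inconclusive (a zero eigenvalue)")
print("eigenvalues:", eigenvalues)  # [-1., 5.] -> saddle point
```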
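For Fourier series, orthogonality of $\sin(nx)$ on $[-\pi,\pi]$ means each coefficient is just an inner product, exactly like reading off coordinates in an orthogonal basis of $\mathbb R^{n}$. A sketch using a square wave, whose odd sine coefficients are known to be $4/(n\pi)$:

```python
import numpy as np

# <sin(n x), sin(m x)> on [-pi, pi] is pi when n == m and 0 otherwise,
# so b_n = (1/pi) * <f, sin(n x)>.
x = np.linspace(-np.pi, np.pi, 100_001)
dx = x[1] - x[0]
f = np.sign(x)  # a square wave

for n in range(1, 6):
    # Approximate the integral with a Riemann sum.
    b_n = np.sum(f * np.sin(n * x)) * dx / np.pi
    exact = 4 / (np.pi * n) if n % 2 == 1 else 0.0
    print(f"b_{n} = {b_n:+.4f}   (exact: {exact:+.4f})")
```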
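And finally, a toy feed-forward network, just to make "layers of linear transformations plus some nonlinearity" concrete. All sizes and weights here are arbitrary; real networks are trained by adjusting the weights, which this sketch does not do.

```python
import numpy as np

rng = np.random.default_rng(0)

# A two-layer network: linear map, nonlinearity, linear map.
W1 = rng.normal(size=(16, 2))   # layer 1 weights: R^2 -> R^16
b1 = np.zeros(16)
W2 = rng.normal(size=(1, 16))   # layer 2 weights: R^16 -> R^1
b2 = np.zeros(1)

def relu(z):
    # The nonlinearity: without it, the two layers would compose into
    # the single linear map W2 @ W1 and gain nothing.
    return np.maximum(z, 0.0)

def predict(x):
    return W2 @ relu(W1 @ x + b1) + b2

print(predict(np.array([0.5, -1.0])))
```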
![[smc-spring-2024-math-13/linear-algebra-notes/---files/third-eye.png]]